
Liang, Qianhui

Topics (weight, terms)
0.257  architecture, scheme, soa, distributed, architectures, layer, discuss, central, difference, coupled, service-oriented, advantages, standard, loosely, table
0.225  set, approach, algorithm, optimal, used, develop, results, use, simulation, experiments, algorithms, demonstrate, proposed, optimization, present
0.123  response, responses, different, survey, questions, results, research, activities, respond, benefits, certain, leads, two-stage, interactions, study


Coauthors: Datta, Anindya (1); Dutta, Kaushik (1); VanderMeer, Debra (1)
Keywords: caching (1); service-oriented architecture (1); SOA (1); XML (1)

Articles (1)

SOA Performance Enhancement Through XML Fragment Caching. (Information Systems Research, 2012)
Authors:
Abstract:
    Organizations are increasingly choosing to implement service-oriented architectures to integrate distributed, loosely coupled applications. These architectures are implemented as services, which typically use XML-based messaging to communicate between service consumers and service providers across enterprise networks. We propose a scheme for caching fragments of service response messages to improve performance and service quality in service-oriented architectures. In our fragment caching scheme, we decompose responses into smaller fragments such that reusable components can be identified and cached in the XML routers of an XML overlay network within an enterprise network. Such caching mitigates processing requirements on providers and moves content closer to users, thus reducing bandwidth requirements on the network as well as improving service times. We describe the system architecture and caching algorithm details for our caching scheme, develop an analysis of the expected benefits of our scheme, and present the results of both simulation and case-study-based experiments to show the validity and performance improvements provided by our caching scheme. Our simulation results show up to a 60% reduction in bandwidth consumption and up to a 50% improvement in response time. Further, our case study experiments demonstrate that when there is no resource bottleneck, the cache-enabled case reduces average response times by 40%-50% and increases throughput by 150% compared to the no-cache and full-message-caching cases. In experiments contrasting fragment caching with full message caching, we found that full message caching provides benefits when the number of possible unique responses is low, while the benefits of fragment caching increase as the number of possible unique responses increases. These experimental results clearly demonstrate the benefits of our approach.
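The core idea in the abstract, decomposing XML responses into fragments so that shared components are cached once and responses are reassembled from cached pieces, can be illustrated with a minimal sketch. This is not the paper's algorithm: the `FragmentCache` class, hash-based fragment keys, and top-level-child decomposition are illustrative assumptions, and a real XML router would make far more nuanced fragmentation and placement decisions.

```python
import hashlib
import xml.etree.ElementTree as ET

class FragmentCache:
    """Hypothetical in-memory cache keyed by a hash of fragment content."""
    def __init__(self):
        self.store = {}

    def _key(self, fragment_xml):
        return hashlib.sha256(fragment_xml.encode()).hexdigest()

    def put(self, fragment_xml):
        # Identical fragments from different responses map to the same key,
        # so a shared fragment is stored only once.
        k = self._key(fragment_xml)
        self.store.setdefault(k, fragment_xml)
        return k

    def get(self, key):
        return self.store.get(key)

def decompose(response_xml):
    """Naive decomposition: split a response into its top-level child fragments."""
    root = ET.fromstring(response_xml)
    return [ET.tostring(child, encoding="unicode") for child in root]

def cache_response(cache, response_xml):
    """Cache each fragment and return the keys needed to rebuild the response."""
    return [cache.put(frag) for frag in decompose(response_xml)]

def assemble(cache, keys, root_tag="response"):
    """Reassemble a response from cached fragments (assumes all keys hit)."""
    body = "".join(cache.get(k) for k in keys)
    return f"<{root_tag}>{body}</{root_tag}>"
```

In this sketch, two responses that share a catalog fragment but differ in a user-specific fragment store the shared piece only once, which mirrors the reuse that lets fragment caching outperform full-message caching as the number of unique responses grows.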